Neural Nets and RBF Nets

Authors

  • Christopher James Cartmell
  • Amanda Sharkey
Abstract

Declaration

All sentences or passages quoted in this dissertation from other people's work have been specifically acknowledged by clear cross-referencing to author, work and page(s). Any illustrations which are not the work of the author of this dissertation have been used with the explicit permission of the originator and are specifically acknowledged. I understand that failure to do this amounts to plagiarism and will be considered grounds for failure in this dissertation and the degree examination as a whole.

1.1 Abstract

This paper is experimental in nature and focuses on the strengths and weaknesses of a recently proposed boosting algorithm called AdaBoost. Starting with a literature review of boosting, we provide an in-depth history of the algorithms that led to the discovery of AdaBoost and comment on recent experimentation in this area. Boosting is a general method for improving the accuracy of a given learning algorithm. When used with neural nets, AdaBoost creates a set of nets, each trained on a different sample drawn from the training set; the combination of this set of nets may then offer better performance than any single net trained on all of the training data (a short illustrative sketch of this procedure follows the time plan below).

The latter half of the paper looks at what factors affect the performance of the algorithm when used with neural networks and radial basis function networks, and tries to answer the following questions: (1) Is AdaBoost able to produce good classifiers when using ANNs or RBFs as base learners? (2) Does altering the number of training epochs affect the efficiency of the classifier? (3) Does altering the number of hidden units have any effect? (4) How is AdaBoost affected by the presence of noise? (5) What causes the observed effects? Our findings support the theory that AdaBoost is a good classifier in low-noise cases but suffers from overfitting in the presence of noise. Specifically, AdaBoost can be viewed as a constrained gradient descent in an error function defined with respect to the margin.

1.3 Time Plan

1.3.1 Semester 1

Weeks 1-3: Read papers on or around the topic to gain a feel for what is involved, and write interim report 1.
Weeks 4-6: Learn how to use Matlab and experiment with the AdaBoost implementation by Gunnar Rätsch.
Week 7: Ensure that the data sets to be used in the empirical study are in the correct format.
Weeks 8-10: …
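To illustrate the boosting procedure described in the abstract, here is a minimal sketch of the resampling variant of AdaBoost with small multilayer perceptrons as base learners. This is not the dissertation's Matlab code: the function names and parameters (rounds, hidden, epochs) are illustrative, scikit-learn's MLPClassifier stands in for a generic neural net, and labels are assumed to be binary and encoded as -1/+1.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    def adaboost_mlp(X, y, rounds=10, hidden=5, epochs=50, seed=0):
        # y must hold binary labels in {-1, +1}.
        rng = np.random.default_rng(seed)
        n = len(y)
        D = np.full(n, 1.0 / n)               # example weights, initially uniform
        nets, alphas = [], []
        for _ in range(rounds):
            # Each net trains on a different sample, drawn according to the
            # current weights (the resampling form described in the abstract).
            idx = rng.choice(n, size=n, replace=True, p=D)
            net = MLPClassifier(hidden_layer_sizes=(hidden,), max_iter=epochs)
            net.fit(X[idx], y[idx])
            pred = net.predict(X)
            eps = D[pred != y].sum()           # weighted error on the full set
            if eps >= 0.5:                     # no better than chance: stop
                break
            alpha = 0.5 * np.log((1 - eps) / max(eps, 1e-10))
            D = D * np.exp(-alpha * y * pred)  # up-weight misclassified points
            D = D / D.sum()
            nets.append(net)
            alphas.append(alpha)
        return nets, alphas

    def adaboost_predict(nets, alphas, X):
        # Weighted majority vote: sign of the alpha-weighted sum of net outputs.
        votes = sum(a * net.predict(X) for a, net in zip(alphas, nets))
        return np.sign(votes)

Each round up-weights the points the current net misclassified, so later nets concentrate on the hard examples. This update is also the link to the "constrained gradient descent with respect to the margin" view mentioned above: the ensemble f(x) = sum_t alpha_t h_t(x) greedily decreases the exponential loss sum_i exp(-y_i f(x_i)), which depends on the data only through the margins y_i f(x_i).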


Similar resources

Solving Fuzzy Equations Using Neural Nets with a New Learning Algorithm

Artificial neural networks have advantages such as learning, adaptation, fault tolerance, parallelism and generalization. This paper mainly intends to offer a novel method for finding a solution of a fuzzy equation that supposedly has a real solution. To this end, we applied an architecture of fuzzy neural networks in which the corresponding connection weights are real numbers. The ...

Full text

On-line Stable Nonlinear Modelling by Structurally Adaptive Neural Nets

This paper proposes a neural-net-based on-line scheme for modelling discrete-time multivariable nonlinear dynamical systems. Taking advantage of the structural features of RBF (Radial-Basis-Function) neural nets, the method approaches the modelling problem by setting up a coarse RBF model structure in the light of the spatial Fourier transform and spatial sampling theory, then devising appropr...

Full text

HMM Speech Recognition with Neural Net Discrimination

Two approaches were explored which integrate neural net classifiers with Hidden Markov Model (HMM) speech recognizers. Both attempt to improve speech pattern discrimination while retaining the temporal processing advantages of HMMs. One approach used neural nets to provide second-stage discrimination following an HMM recognizer. On a small vocabulary task, Radial Basis Function (RBF) and back-p...

Full text


Normalized Gaussian Radial Basis Function networks

Abstract: The performance of Normalised RBF (NRBF) nets and standard RBF nets is compared in simple classification and mapping problems. In Normalised RBF networks, the traditional roles of weights and activities in the hidden layer are switched. Hidden nodes perform a function similar to a Voronoi tessellation of the input space, and the output weights become the network's output over the pa...

Full text
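For reference, the switched roles of weights and activities that the abstract above describes have a standard textbook form. The following is the usual normalised-RBF definition with Gaussian basis functions, not necessarily the exact variant studied in that paper:

    % Standard RBF output: a weighted sum of basis activations
    y(\mathbf{x}) = \sum_{i=1}^{M} w_i\,\phi_i(\mathbf{x}),
    \qquad
    \phi_i(\mathbf{x}) = \exp\!\left(-\frac{\lVert \mathbf{x}-\mathbf{c}_i \rVert^2}{2\sigma_i^2}\right)

    % Normalised RBF: each activation is divided by the total activation,
    % so the hidden-layer outputs form a partition of unity
    y_{\mathrm{NRBF}}(\mathbf{x}) = \frac{\sum_{i=1}^{M} w_i\,\phi_i(\mathbf{x})}{\sum_{j=1}^{M} \phi_j(\mathbf{x})}

Because the normalised activations sum to one, each input is softly assigned to its nearest centres, which gives the Voronoi-tessellation-like behaviour the abstract mentions, and the output weight w_i becomes the network's output over the region dominated by centre i.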

A Comparative Study of MLP and RBF Neural Nets in the Estimation of the Foetal Weight and Length

Foetal weight estimation is a clinically relevant task for proper medical care in perinatal situations. Usually this estimation is based on features such as measurements derived from echographic examinations. Several formulas have been developed by other authors for performing this estimation, with a limited degree of success. Our approach is based on multilayer perceptrons (MLP) and radial basis ...

Full text



Publication date: 2002